2.2. Clearing Any Preexisting Data
Let's clear any data that exists in the SQL Azure database so you can start with a clean slate. In SSMS, open a new query connected to the TechBio database, and delete the data from the UserDocs, Users, and Docs tables by executing the following DELETE statements:
DELETE FROM UserDocs
DELETE FROM Docs
DELETE FROM Users
All three of those tables should now contain no data, as shown in Figure 8.
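The order of those DELETE statements matters once foreign keys are involved: UserDocs must be emptied before the tables it references. As a sketch, the ordering can be derived mechanically from the dependencies. (The dependency map below is an assumption about this sample schema: UserDocs references both Users and Docs, which reference nothing.)

```python
# Assumed schema: child table -> list of parent tables it references.
deps = {
    "UserDocs": ["Users", "Docs"],
    "Users": [],
    "Docs": [],
}

def delete_order(deps):
    """Return table names ordered so every child precedes its parents,
    which is the safe order for DELETE statements under foreign keys."""
    ordered, seen = [], set()

    def visit(table):
        if table in seen:
            return
        seen.add(table)
        # A table can be cleared only after every table that references
        # it, so visit the referencing (child) tables first.
        for child, parents in deps.items():
            if table in parents:
                visit(child)
        ordered.append(table)

    for t in deps:
        visit(t)
    return ordered

statements = [f"DELETE FROM {t}" for t in delete_order(deps)]
# statements[0] is "DELETE FROM UserDocs", matching the order above.
```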
2.3. Building a Migration Package
Let's start building an SSIS package to migrate your data. Follow these steps:
In the SSIS package designer, select the Control Flow tab, and drag an Execute SQL task and three data flow tasks from the Toolbox onto the Control Flow designer.
Right-click the Execute SQL task, and select Edit from the context menu. The Execute SQL Task Editor opens.
Change the task name to Clear Data, leave the Connection Type as OLE DB, and leave the SQLSourceType as Direct Input.
In the Connection property, select <New connection>, as shown in Figure 9. The Connection Manager dialog opens (see Figure 10).
In the Connection Manager dialog, enter the server name of your SQL Azure server, select Use SQL Authentication, and enter your SQL Azure username and password. The username must be in the format username@server, where username is your Administrator username or a valid SQL Azure username and server is the first part of the server name (the piece prior to .database.windows.net).
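The username@server format is easy to get wrong. A minimal helper that derives the login from the full server DNS name (the function name and code are illustrative only, not part of any tool used here):

```python
def azure_login(username, server_dns_name):
    """Build the username@server login SQL Azure expects, where
    server is the portion of the DNS name before the first dot
    (i.e., before .database.windows.net)."""
    server = server_dns_name.split(".")[0]
    return f"{username}@{server}"

# e.g. azure_login("MyAdmin", "abc123.database.windows.net")
# -> "MyAdmin@abc123"
```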
In the "Select or enter a database name" field, the OLE DB provider doesn't return a list of databases. No big deal: type in the name TechBio.
Click the Test Connection button. If you've entered everything correctly and your firewall is set up, your test connection succeeds.
Click OK in the Connection Manager dialog.
Back in the Execute SQL Task Editor, click the ellipsis button in the SQLStatement property to display a small Enter SQL Query dialog in which to enter one or more T-SQL statements. Enter the following DELETE statements, which clear out the data from the previous example. (This isn't critical, but it gives you a clean slate to start with.)
DELETE FROM UserDocs
DELETE FROM Users
DELETE FROM Docs
Click OK in the Enter SQL Query dialog. The Execute SQL Task Editor dialog should look like Figure 11. Click OK.
Back on the Control Flow tab in the SSIS package designer, make sure the Clear Data task is highlighted. Notice the green arrow coming from the bottom of the Clear Data task: click anywhere on this green arrow, and drag it to the first data flow task. Doing so creates a connection from the Clear Data task to the first data flow task, signifying the flow of execution. You can see an example of this in Figure 5-14. When the Clear Data task has completed executing, the first data flow task will then execute.
Let's add logic to the first data flow task. Double-click the linked data flow task (or right-click it and select Edit). Doing so takes you to the Data Flow tab.
Drag an OLE DB Source task and an OLE DB Destination task to the Data Flow designer surface. This is where the actions of pulling data from the source database (the local DB) and copying it to the destination database (SQL Azure) take place.
Right-click the OLE DB Source task, and click Edit. Doing so opens the OLE DB Source Editor, where you define a connection to your local database, such as the connection shown in Figure 5-10.
You already have a connection to the SQL Azure database, but you need to create a connection to your local database that your tasks use to copy the data.
In the OLE DB Source Editor, you see the connection to the SQL Azure database. Click the New button to open the Configure OLE DB Connection Manager dialog. Click New again to open the Connection Manager dialog you saw in Figure 10.
In this dialog, enter the information to connect to your local copy of the TechBio database.
Test the connection, and then click OK in the Connection Manager dialog.
Click OK in the Configure OLE DB Connection Manager dialog.
Back in the OLE DB Source Editor, click the Table/View drop-down, select the Docs table, and then click OK.
As you did for the control flow task, drag the green arrow from the OLE DB Source task to the OLE DB Destination task.
Double-click the OLE DB Destination task to edit its properties; this task defines where the data is going: the SQL Azure database. Because you've already created a connection to the SQL Azure database, you can use that connection. In the OLE DB Destination Editor, select the SQL Azure connection, and then select the Docs table from the drop-down list of tables. Oops—you get the error shown in Figure 12.
This is interesting, because you didn't get this error when configuring the Execute SQL task. The difference is that the two Connection Manager dialogs don't operate quite the same way. The Connection Manager dialog for the Execute SQL task let you type in the database name, whereas the OLE DB Destination task required you to select the table from the list. But when you expanded the list, you received the error shown in Figure 12.
The fix is to use an ADO.NET destination instead of the OLE DB destination. To do this, continue as follows:
Delete the OLE DB Destination task, and drag an ADO.NET Destination task onto the surface.
Connect the two tasks, and then double-click the ADO.NET Destination task to configure it.
In the ADO.NET Destination Editor dialog, click the New button to configure a new ADO.NET connection.
Walk through the same steps as in the previous two connection configurations. This time, you're able to type the database name in the Connection Manager dialog.
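Behind that dialog, the ADO.NET destination ends up using a SqlClient-style connection string. A sketch of what such a string for SQL Azure typically looks like, assembled in Python purely for illustration (the server, database, and credential values are placeholders; the encryption settings reflect SQL Azure's requirement for encrypted connections):

```python
def azure_adonet_connection_string(server, database, user, password):
    """Sketch of a SqlClient-style connection string for SQL Azure.
    Note the username@server login format and the encrypted
    connection SQL Azure requires."""
    short = server.split(".")[0]  # piece before .database.windows.net
    parts = [
        f"Server=tcp:{server}",
        f"Database={database}",
        f"User ID={user}@{short}",
        f"Password={password}",
        "Encrypt=True",
    ]
    return ";".join(parts)
```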
Click OK in all the dialogs until you're back in the ADO.NET Destination Editor.
Before you click OK in this dialog, click Mappings at left, as shown in Figure 13. Doing so ensures that the source table columns are appropriately mapped to the destination table columns. Click OK in the ADO.NET Destination Editor.
If you're new to SSIS, congratulations: you've just configured your first SSIS data flow. Your data flow should look like Figure 14—not very exciting, but useful nonetheless. If you aren't new to SSIS, you still deserve congratulations, because you've successfully configured a data flow to migrate data to the cloud.
Put down the root beer, though, because you aren't quite done. Continue with these steps:
Go back to the Control Flow tab, connect the first data flow task to the second data flow task, and then connect the second and third data flow tasks.
Double-click the second data flow task, and repeat the configuration you performed for the first data flow task. You don't need to re-create the connections, but in the Source and Destination Editors for the second data flow, select the Users table.
Repeat the process for the third data flow task, this time selecting the UserDocs table.
When you're finished, your control flow task should look like Figure 15. The tasks aren't green (yet), but the flow should be similar.